• Time-MoE is a GitHub project for building billion-scale time series foundation models with a mixture-of-experts architecture. The models perform auto-regressive forecasting with flexible prediction horizons and context lengths of up to 4096. The repository offers several variants, Time-MoE (base), Time-MoE (large), and Time-MoE (ultra), with parameter counts ranging from 50 million to 2.4 billion, and the training corpus, Time-300B, is announced for release. Getting started requires Python 3.10 and pinned dependency versions, including a specific transformers release. The README provides forecasting instructions with code snippets for both normalized and non-normalized input sequences (see the forecasting sketch below), as well as benchmark datasets and evaluation scripts for assessing performance on datasets such as ETTh1 (see the evaluation sketch below). Users who find the models useful are asked to cite the associated paper, and the project links to related resources and papers on large language models for time series analysis. The repository is licensed under the Apache-2.0 License and acknowledges the GitHub repositories that influenced its development.
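A minimal forecasting sketch in the spirit of the README's snippets, showing the normalized-input path end to end. The Hugging Face model ID Maple728/TimeMoE-50M, the CPU device map, and the tensor shapes are assumptions for illustration; the repository's own instructions and pinned transformers version take precedence.

```python
import torch
from transformers import AutoModelForCausalLM

# Load a Time-MoE checkpoint through transformers' remote-code path.
# The model ID below is an assumption for illustration; use the ID given in the repo.
model = AutoModelForCausalLM.from_pretrained(
    "Maple728/TimeMoE-50M",
    device_map="cpu",
    trust_remote_code=True,
)

batch_size, context_length, prediction_length = 2, 12, 6
seqs = torch.randn(batch_size, context_length)  # raw (non-normalized) input series

# Per-series normalization for non-normalized inputs.
mean = seqs.mean(dim=-1, keepdim=True)
std = seqs.std(dim=-1, keepdim=True)
normed_seqs = (seqs - mean) / std

# Auto-regressive forecasting: the model generates prediction_length new time steps.
output = model.generate(normed_seqs, max_new_tokens=prediction_length)
normed_preds = output[:, -prediction_length:]

# Undo the normalization to get forecasts on the original scale.
preds = normed_preds * std + mean
print(preds.shape)  # torch.Size([2, 6])
```

Because generation is auto-regressive over time steps, `max_new_tokens` directly sets the forecast horizon, and denormalizing with the context statistics returns the forecast to the original scale; already-normalized inputs can be passed to `generate` without the scaling steps.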
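The repository ships its own evaluation scripts and benchmark data; as a rough illustration of what such an evaluation computes, the sketch below scores a forecast of the last 96 steps of ETTh1's target column with MSE and MAE. The local CSV path, the "OT" column name, the window sizes, and the model ID are assumptions, and the repo's scripts should be preferred for reproducing reported numbers.

```python
import pandas as pd
import torch
from transformers import AutoModelForCausalLM

# Assumed local path to the ETT benchmark file; adjust to wherever ETTh1.csv lives.
df = pd.read_csv("dataset/ETT-small/ETTh1.csv")
series = torch.tensor(df["OT"].values, dtype=torch.float32)  # "OT" is the usual target column

# Hold out the last `horizon` points as the evaluation target.
context_length, horizon = 512, 96
context = series[-(context_length + horizon):-horizon].unsqueeze(0)  # [1, context_length]
target = series[-horizon:].unsqueeze(0)                              # [1, horizon]

# Model ID is an assumption for illustration, as in the forecasting sketch above.
model = AutoModelForCausalLM.from_pretrained(
    "Maple728/TimeMoE-50M", device_map="cpu", trust_remote_code=True
)

# Normalize with context statistics, forecast, then invert the scaling.
mean = context.mean(dim=-1, keepdim=True)
std = context.std(dim=-1, keepdim=True)
output = model.generate((context - mean) / std, max_new_tokens=horizon)
preds = output[:, -horizon:] * std + mean

mse = torch.mean((preds - target) ** 2).item()
mae = torch.mean(torch.abs(preds - target)).item()
print(f"MSE={mse:.4f}  MAE={mae:.4f}")
```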